
    Face2face: advancing the science of social interaction

    Face-to-face interaction is core to human sociality and its evolution, and provides the environment in which most human communication occurs. Research into the full complexities that define face-to-face interaction requires a multi-disciplinary, multi-level approach, illuminating from different perspectives how we and other species interact. This special issue showcases a wide range of approaches, bringing together detailed studies of naturalistic social-interactional behaviour with larger-scale analyses for generalization, and investigations of socially contextualized cognitive and neural processes that underpin the behaviour we observe. We suggest that this integrative approach will allow us to propel forwards the science of face-to-face interaction by leading us to new paradigms and novel, more ecologically grounded and comprehensive insights into how we interact with one another and with artificial agents, how differences in psychological profiles might affect interaction, and how the capacity to socially interact develops and has evolved in humans and other species. This theme issue takes a first step in this direction, with the aim of breaking down disciplinary boundaries and emphasizing the value of illuminating the many facets of face-to-face interaction. This article is part of a discussion meeting issue 'Face2face: advancing the science of social interaction'.

    Social top-down response modulation (STORM): a model of the control of mimicry in social interaction

    As a distinct feature of human social interactions, spontaneous mimicry has been widely investigated in the past decade. Research suggests that mimicry is a subtle and flexible social behavior that plays an important role in communication and affiliation. However, fundamental questions of why and how people mimic remain unanswered. In this paper, we evaluate past theories from social psychology and cognitive neuroscience of why people mimic and of the brain systems that implement mimicry. By reviewing recent behavioral and neuroimaging studies on the control of mimicry by social signals, we conclude that the subtlety and sophistication of mimicry in social contexts reflect a social top-down response modulation (STORM) that increases one's social advantage, and that this mechanism is most likely implemented by the medial prefrontal cortex (mPFC). We suggest that this STORM account of mimicry is important for our understanding of social behavior and social cognition, and has implications for future research in autism.

    Action understanding requires the left inferior frontal cortex.

    Numerous studies have established that inferior frontal cortex is active when hand actions are planned, imagined, remembered, imitated, and even observed. Furthermore, it has been proposed that these activations reflect a process of simulating the observed action to allow it to be understood and thus fully perceived. However, direct evidence for a perceptual role for left inferior frontal cortex is rare, and linguistic or motor contributions to the reported activations have not been ruled out. We used repetitive transcranial magnetic stimulation (rTMS) over the inferior frontal gyrus during a perceptual weight-judgement task to test the hypothesis that this region contributes to action understanding. rTMS at this site impaired judgements of the weight of a box lifted by a person, but not judgements of the weight of a bouncing ball or of stimulus duration, and rTMS at control sites had no effect. This demonstrates that the integrity of the left inferior frontal gyrus is necessary for accurate perceptual judgements about other people's actions.

    Effects of Being Watched on Eye Gaze and Facial Displays of Typical and Autistic Individuals During Conversation

    Communication with others relies on coordinated exchanges of social signals, such as eye gaze and facial displays. However, this can only happen when partners are able to see each other. Although previous studies report that autistic individuals have difficulties in planning eye gaze and making facial displays during conversation, evidence from real-life dyadic tasks is scarce and mixed. Across two studies, we investigate how eye gaze and facial displays of typical and high-functioning autistic individuals are modulated by the belief in being seen and by the potential to show true gaze direction. Participants were recorded with an eye-tracking and video-camera system while they completed a structured Q&A task with a confederate under three social contexts: pre-recorded video, video-call and face-to-face. Typical participants gazed less at the confederate and produced more facial displays when they were being watched and when they were speaking. Contrary to our hypotheses, eye gaze and facial motion patterns in the autistic participants were overall similar to the typical group. This suggests that high-functioning autistic participants are able to use eye gaze and facial displays as social signals. Future studies will need to investigate to what extent this reflects spontaneous behaviour or the use of compensation strategies.

    Social signalling as a framework for second-person neuroscience

    Despite the recent increase in second-person neuroscience research, it is still hard to understand which neurocognitive mechanisms underlie real-time social behaviours. Here, we propose that social signalling can help us understand social interactions both at the single- and two-brain level in terms of social signal exchanges between senders and receivers. First, we show how subtle manipulations of being watched provide an important tool to dissect meaningful social signals. We then focus on how social signalling can help us build testable hypotheses for second-person neuroscience, using the examples of imitation and gaze behaviour. Finally, we suggest that linking neural activity to specific social signals will be key to fully understanding the neurocognitive systems engaged during face-to-face interactions.

    Autistic adults benefit from and enjoy learning via social interaction as much as neurotypical adults do

    Background: Autistic people show poor processing of social signals (i.e. about the social world). But how do they learn via social interaction?
    Methods: 68 neurotypical adults and 60 autistic adults learned about obscure items (e.g. exotic animals) over Zoom (i) in a live video-call with the teacher, (ii) from a recorded learner-teacher interaction video and (iii) from a recorded teacher-alone video. Data were analysed via analysis of variance and multi-level regression models.
    Results: Live teaching provided the most effective learning condition, with no difference between groups. Enjoyment was the strongest predictor of learning: both groups enjoyed the live interaction significantly more than the other conditions and reported similar anxiety levels across conditions.
    Limitations: Some of the autistic participants were self-diagnosed; however, further analysis excluding these participants showed the same results. Recruiting participants over online platforms may have introduced bias in our sample. Future work should investigate learning in social contexts via diverse sources (e.g. schools).
    Conclusions: These findings advocate for a distinction between learning about the social versus learning via the social: cognitive models of autism should be revisited to consider social interaction not just as a puzzle to decode but rather as a medium through which people, including neurodiverse groups, learn about the world around them.
    Trial registration: Part of this work was pre-registered before data collection at https://doi.org/10.17605/OSF.IO/5PGA

    Automatic imitation in a rich social context with virtual characters

    It has been well established that people respond faster when they perform an action that is congruent with an observed action than when they respond with an incongruent action. Here we propose a new method of using interactive Virtual Characters (VCs) to test whether social congruency effects can be obtained in a richer social context with sequential hand-arm actions. Two separate experiments were conducted, exploring whether it is feasible to measure spatial congruency (Experiment 1) and anatomical congruency (Experiment 2) in response to a VC, compared to the same action sequence indicated by three virtual balls. In Experiment 1, we found a robust spatial congruency effect for both the VC and the virtual balls, modulated by a social facilitation effect for participants who felt the VC was human. In Experiment 2, which allowed for anatomical congruency, a form-by-congruency interaction provided evidence that participants automatically imitate the actions of the VC but do not imitate the balls. Our method and results build a bridge between studies using minimal stimuli in automatic imitation and studies of mimicry in rich social interaction, and open a new avenue for future research on automatic imitation in more ecologically valid social interactions.
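    The congruency effect described above is typically quantified as a reaction-time difference between incongruent and congruent trials. A minimal sketch of that measure (function and variable names are illustrative, not taken from the paper):

```python
import numpy as np

def congruency_effect(rt_congruent, rt_incongruent):
    """Automatic-imitation index: the mean reaction-time cost (ms) of
    responding with an action incongruent with the observed one.
    Positive values indicate facilitation by congruent observed actions."""
    return float(np.mean(rt_incongruent) - np.mean(rt_congruent))
```

    In designs like the one above, this index would be computed separately per stimulus type (VC vs. virtual balls) and compared across conditions.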

    The Role of Eye Gaze During Natural Social Interactions in Typical and Autistic People

    Social interactions involve complex exchanges of a variety of social signals, such as gaze, facial expressions, speech and gestures. Focusing on the dual function of eye gaze, this review explores how the presence of an audience, communicative purpose and temporal dynamics of gaze allow interacting partners to achieve successful communication. First, we focus on how being watched modulates social cognition and behavior. We then show that the study of interpersonal gaze processing, particularly gaze temporal dynamics, can provide valuable understanding of social behavior in real interactions. We propose that the Interpersonal Gaze Processing model, which combines both sensing and signaling functions of eye gaze, provides a framework for making sense of gaze patterns in live interactions. Finally, we discuss how autistic individuals process the belief in being watched and the interpersonal dynamics of gaze, and suggest that systematic manipulation of the factors modulating gaze signaling can reveal which aspects of social eye gaze are challenging in autism.

    Head Nodding and Hand Coordination Across Dyads in Different Conversational Contexts

    Patrick Falk, Roser Cañigueral, Jamie A Ward et al., 03 November 2023, PREPRINT (Version 1) available at Research Square [https://doi.org/10.21203/rs.3.rs-3526068/v1]
    This paper aims to explore what different patterns of head nodding and hand movement coordination mean in conversation by recording and analysing interpersonal coordination as it naturally occurs in social interactions. Understanding the timing and frequencies at which such movement behaviours occur can help us answer how and why we use these signals. Here we use high-resolution motion capture to examine three different types of two-person conversation involving different types of information-sharing, in order to explore the potential meaning and coordination of head nodding and hand motion signals. We also test whether the tendency to engage in fast or slow nodding behaviour is a fixed personality trait that differs between individuals. Our results show coordinated slow nodding only in a picture-description task, which implies that this behaviour is not a universal signal of affiliation but is context driven. We also find robust fast nodding behaviour in the two contexts where novel information is exchanged. For hand movement, we find hints of low-frequency coordination during one-way information sharing, but no consistent signalling during information recall. Finally, we show that nodding is consistently driven by context but is not a useful measure of individual differences in social skills. We interpret these results in terms of theories of nonverbal communication and consider how these methods will help advance automated analyses of human conversation behaviours.
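    The fast/slow nodding distinction above rests on separating head-motion frequency bands. A minimal sketch of how a head-pitch trace could be labelled by its dominant band, assuming uniformly sampled motion-capture data; the band edges and all names are illustrative assumptions, not the paper's analysis pipeline:

```python
import numpy as np

def nod_band_power(pitch, fs, band):
    """Power of a head-pitch signal within a frequency band (Hz),
    via a plain FFT periodogram on the mean-removed trace."""
    pitch = np.asarray(pitch, dtype=float) - np.mean(pitch)
    spec = np.abs(np.fft.rfft(pitch)) ** 2
    freqs = np.fft.rfftfreq(pitch.size, d=1.0 / fs)
    mask = (freqs >= band[0]) & (freqs < band[1])
    return float(spec[mask].sum())

def classify_nodding(pitch, fs, slow=(0.2, 1.0), fast=(2.0, 5.0)):
    """Label a head-pitch trace 'slow' or 'fast' by which band dominates.
    Band edges here are assumed values for illustration only."""
    if nod_band_power(pitch, fs, slow) >= nod_band_power(pitch, fs, fast):
        return "slow"
    return "fast"
```

    A Welch-style averaged periodogram would be more robust on noisy motion-capture data; the single FFT keeps the sketch short.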

    A Simple Method for Synchronising Multiple IMUs Using the Magnetometer

    This paper presents a novel method to synchronise multiple IMU (inertial measurement unit) devices using their onboard magnetometers. The method uses an external electromagnetic pulse to create a known event that is measured by the magnetometer of each IMU and in turn used to synchronise the devices. Applied to 4 IMU devices, the method decreases their de-synchronisation over a 1-hour recording from 270 ms, when using only the RTC (real-time clock), to 40 ms. It is proposed that this can be further improved to approximately 3 ms by increasing the magnetometer's sample frequency from 25 Hz to 300 Hz.
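    The pulse-alignment idea can be sketched as follows, assuming each IMU yields a timestamp array and a magnetometer field-magnitude array; the spike-detection threshold and data layout are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def detect_pulse_time(timestamps, mag_norm, threshold_sigma=6.0):
    """Return the timestamp of the first magnetometer sample whose field
    magnitude deviates from the baseline by more than threshold_sigma
    standard deviations -- taken as the shared electromagnetic pulse."""
    mag_norm = np.asarray(mag_norm, dtype=float)
    baseline = np.median(mag_norm)
    spread = np.std(mag_norm)
    spikes = np.nonzero(np.abs(mag_norm - baseline) > threshold_sigma * spread)[0]
    if spikes.size == 0:
        raise ValueError("no pulse found above threshold")
    return timestamps[spikes[0]]

def clock_offsets(devices):
    """Given {name: (timestamps, mag_norm)} per IMU, return each device's
    clock offset relative to the first device, in the same time units.
    Subtracting these offsets aligns the streams to a common timeline."""
    names = list(devices)
    ref = detect_pulse_time(*devices[names[0]])
    return {name: detect_pulse_time(*devices[name]) - ref for name in names}
```

    To correct clock drift over a long recording (rather than a constant offset), a pulse at the start and end would let each device's timeline be rescaled linearly between the two events.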